Extending Kernel Fisher Discriminant Analysis with the Weighted Pairwise Chernoff Criterion

Authors

  • Guang Dai
  • Dit-Yan Yeung
  • Hong Chang
Abstract

Many linear discriminant analysis (LDA) and kernel Fisher discriminant analysis (KFD) methods are based on the restrictive assumption that the data are homoscedastic. In this paper, we propose a new KFD method called heteroscedastic kernel weighted discriminant analysis (HKWDA) which has several appealing characteristics. First, like all kernel methods, it can handle nonlinearity efficiently in a disciplined manner. Second, by incorporating a weighting function that can capture heteroscedastic data distributions into the discriminant criterion, it can work under more realistic situations and hence can further enhance the classification accuracy in many real-world applications. Moreover, it can effectively deal with the small sample size problem. We have performed some face recognition experiments to compare HKWDA with several linear and nonlinear dimensionality reduction methods, showing that HKWDA consistently gives the best results.
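The kernel machinery that HKWDA builds on can be illustrated with a minimal two-class kernel Fisher discriminant computed entirely in the dual (kernel) space. This is a sketch of plain KFD, not the authors' weighted pairwise Chernoff criterion; the RBF kernel, function names, and parameters are illustrative assumptions. The ridge regularizer is also the simplest way to keep the within-class scatter invertible under the small sample size problem mentioned in the abstract:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant: returns dual coefficients alpha.

    Maximizes the Fisher ratio in feature space; the regularizer `reg`
    keeps the dual within-class scatter invertible (the singularity that
    the small sample size problem causes).
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    classes = np.unique(y)
    # Kernelized class means: M_c[j] = (1/n_c) * sum_{i in class c} k(x_j, x_i)
    M = [K[:, y == c].mean(axis=1) for c in classes]
    # Dual within-class scatter: N = sum_c K_c (I - 1/n_c) K_c^T
    N = np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # Fisher direction in the dual space (up to scale)
    return np.linalg.solve(N + reg * np.eye(n), M[0] - M[1])

def kfd_project(X_train, alpha, X_new, gamma=1.0):
    """Project new points onto the learned discriminant direction."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

On nonlinearly separable data such as two concentric circles, projecting onto this single discriminant direction already separates the classes, which linear LDA cannot do in the input space.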


Similar papers

Adaptive Quasiconformal Kernel Fisher Discriminant Analysis via Weighted Maximum Margin Criterion

Kernel Fisher discriminant analysis (KFD) is an effective method for extracting nonlinear discriminant features of input data using the kernel trick. However, conventional KFD algorithms suffer from the kernel selection problem as well as the singularity problem. To overcome these limitations, a novel nonlinear feature extraction method called adaptive quasiconformal kernel Fisher discriminant ana...


Uncorrelated heteroscedastic LDA based on the weighted pairwise Chernoff criterion

We propose an uncorrelated heteroscedastic LDA (UHLDA) technique, which extends the uncorrelated LDA (ULDA) technique by integrating the weighted pairwise Chernoff criterion. UHLDA can extract the discriminatory information present both in the differences between per-class means and in the differences between per-class covariance matrices. Meanwhile, the extracted feature components are statistica...


Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria

We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). It can be seen that LDA weights the contributions of individual class pairs according to the Euclidean distance between the respective class means. We generalize upon LDA by introducing a dif...
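The pairwise-weighting idea can be sketched as follows: each class pair contributes a between-class scatter term scaled by a weight that decays with the Mahalanobis distance between the class means, so nearby, easily confused pairs dominate the criterion. The erf-based weighting function below is one published choice from this line of work; treat the exact form and all names as illustrative:

```python
import numpy as np
from math import erf, sqrt

def weighted_between_scatter(means, priors, Sw):
    """Weighted pairwise between-class scatter matrix.

    Each class pair (i, j) contributes the outer product of its mean
    difference, scaled by the pair priors and by a weight w(delta) that
    decreases with the Mahalanobis distance delta between the class
    means, so confusable pairs dominate the criterion.
    """
    Sw_inv = np.linalg.inv(Sw)
    k = len(means)
    Sb = np.zeros_like(Sw)
    for i in range(k):
        for j in range(i + 1, k):
            d = np.asarray(means[i]) - np.asarray(means[j])
            delta = sqrt(d @ Sw_inv @ d)              # Mahalanobis distance
            w = erf(delta / (2 * sqrt(2))) / (2 * delta ** 2)
            Sb += priors[i] * priors[j] * w * np.outer(d, d)
    return Sb

# The reduced directions are then the leading eigenvectors of Sw^{-1} Sb,
# exactly as in classical LDA but with the reweighted scatter.
```

With this weighting, a pair of classes whose means are close receives a larger per-unit weight than a distant pair, which is what shifts the criterion away from already well-separated classes.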


Kernel-based Weighted Discriminant Analysis with QR Decomposition and Its Application to Face Recognition

Kernel discriminant analysis (KDA) is a widely used approach in feature extraction problems. However, for high-dimensional multi-class tasks, such as face recognition, traditional KDA algorithms have the limitation that the Fisher criterion is non-optimal with respect to classification rate. Moreover, they suffer from the small sample size problem. This paper presents two variants of KDA called ...


Chernoff Dimensionality Reduction-Where Fisher Meets FKT

Well-known linear discriminant analysis (LDA) based on the Fisher criterion is incapable of dealing with heteroscedasticity in data. However, in many practical applications we often encounter heteroscedastic data, i.e., within-class scatter matrices cannot be expected to be equal. A technique based on the Chernoff criterion for linear dimensionality reduction has been proposed recently. The te...
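For two Gaussian classes, the Chernoff criterion reduces to the Chernoff distance between the class densities. Its second, log-determinant term depends only on the covariances, so it stays nonzero for heteroscedastic classes even when the class means coincide, which is exactly the information the Fisher criterion discards. A minimal sketch (note that the convention for which covariance carries the alpha weight varies across references; this is one common form):

```python
import numpy as np

def chernoff_distance(mu1, S1, mu2, S2, alpha=0.5):
    """Chernoff distance between Gaussians N(mu1, S1) and N(mu2, S2).

    The first term generalizes the Fisher (Mahalanobis) separation; the
    second is nonzero whenever the covariances differ, capturing purely
    heteroscedastic discriminatory information. alpha = 0.5 recovers the
    Bhattacharyya distance.
    """
    d = mu1 - mu2
    Sa = (1 - alpha) * S1 + alpha * S2
    term_mean = 0.5 * alpha * (1 - alpha) * d @ np.linalg.solve(Sa, d)
    _, logdet_a = np.linalg.slogdet(Sa)
    _, logdet_1 = np.linalg.slogdet(S1)
    _, logdet_2 = np.linalg.slogdet(S2)
    term_cov = 0.5 * (logdet_a - (1 - alpha) * logdet_1 - alpha * logdet_2)
    return term_mean + term_cov
```

For two classes with identical means but different covariances, the Fisher criterion is zero while this distance is strictly positive, which is the motivation for Chernoff-based dimensionality reduction.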



Journal:

Volume   Issue

Pages  -

Year of publication: 2006